A contribution to the theory of Chernoff bounds

Authors

  • James A. Bucklew
  • John S. Sadowsky
Abstract

In estimating the unknown location of a rectangular signal observed in white noise, the asymptotic risks of three important estimators are compared under L1/L2 losses. A different numerical scheme is used to improve the accuracy of Ibragimov and Hasminskii's result, and it also yields further information and numerical comparisons for the problem.


Related articles

Chernoff bounds on pairwise error probabilities of space-time codes

We derive Chernoff bounds on pairwise error probabilities of coherent and noncoherent space-time signaling schemes. First, general Chernoff bound expressions are derived for a correlated Ricean fading channel and correlated additive Gaussian noise. Then, we specialize the obtained results to the cases of space-time separable noise, white noise, and uncorrelated fading. We derive approximate Cher...


Probability of error, equivocation, and the Chernoff bound

Let us consider the usual decision-theory problem of classifying an observation X as coming from one of m possible classes (hypotheses) C_1, C_2, ..., C_m. Let π_1, ..., π_m denote the a priori probabilities on the hypotheses, and let p_1(x), ..., p_m(x) denote the conditional probability density functions given the true hypothesis. Let us assume that these are known. Then it is well known t...
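The Bayes setup described in this snippet admits a small numerical sketch. The two-class priors and pmfs below are hypothetical, chosen only to illustrate that the Bayes error is dominated by the Bhattacharyya bound (the Chernoff bound at s = 1/2):

```python
import math

priors = [0.5, 0.5]        # hypothetical priors pi_1, pi_2
p1 = [0.6, 0.3, 0.1]       # hypothetical conditional pmf p_1(x)
p2 = [0.1, 0.3, 0.6]       # hypothetical conditional pmf p_2(x)

# Bayes error: sum over x of the smaller posterior mass
bayes_error = sum(min(priors[0] * a, priors[1] * b) for a, b in zip(p1, p2))

# Bhattacharyya (Chernoff, s = 1/2) bound: P_e <= sqrt(pi_1 pi_2) * sum sqrt(p_1 p_2)
bound = math.sqrt(priors[0] * priors[1]) * sum(math.sqrt(a * b) for a, b in zip(p1, p2))

print(bayes_error, bound)  # the bound always dominates the Bayes error
```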


Geometric Applications of Chernoff-type Estimates

In this paper we present a probabilistic approach to some geometric problems in asymptotic convex geometry. The aim of this paper is to demonstrate that the well known Chernoff bounds from probability theory can be used in a geometric context for a very broad spectrum of problems, and lead to new and improved results. We begin by briefly describing Chernoff bounds, and the way we will use them....
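As a minimal illustration of the Chernoff bounds this line of work builds on, the sketch below uses the standard Bernoulli large-deviations form (not anything specific to this paper): for i.i.d. Bernoulli(p) variables, P(S_n >= an) <= exp(-n D(a||p)) when a > p, where D is the binary KL divergence. It compares the bound with the exact binomial tail:

```python
import math

def kl(a, p):
    # Binary KL divergence D(a || p)
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

def chernoff_bound(n, p, a):
    # P(S_n >= a*n) <= exp(-n * D(a || p)) for a > p
    return math.exp(-n * kl(a, p))

def exact_tail(n, p, a):
    # Exact binomial tail P(S_n >= ceil(a*n))
    k0 = math.ceil(a * n)
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k0, n + 1))

n, p, a = 100, 0.5, 0.6
print(chernoff_bound(n, p, a), exact_tail(n, p, a))  # bound >= exact probability
```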


On Improved Bounds for Probability Metrics and f-Divergences (Irwin and Joan Jacobs Center for Communication and Information Technologies)

Derivation of tight bounds for probability metrics and f-divergences is of interest in information theory and statistics. This paper provides elementary proofs that lead, in some cases, to significant improvements over existing bounds; they also lead to the derivation of some existing bounds in a simplified way. The inequalities derived in this paper relate the Bhattacharyya parameter,...
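The Bhattacharyya parameter mentioned here is the s = 1/2 point of the Chernoff coefficient min over s of sum p(x)^s q(x)^(1-s); optimizing over s can only tighten it. A small sketch with hypothetical pmfs (the grid search over s is a simplification):

```python
import math

def bhattacharyya(p, q):
    # BC(p, q) = sum_x sqrt(p(x) q(x))
    return sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))

def chernoff_coefficient(p, q, grid=1000):
    # min over s in (0, 1) of sum_x p(x)^s q(x)^(1-s), via a coarse grid search
    def moment(s):
        return sum(pi**s * qi**(1 - s) for pi, qi in zip(p, q))
    return min(moment(k / grid) for k in range(1, grid))

p = [0.7, 0.2, 0.1]  # hypothetical pmfs
q = [0.1, 0.3, 0.6]
print(chernoff_coefficient(p, q), bhattacharyya(p, q))  # Chernoff <= Bhattacharyya
```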



Journal:
  • IEEE Trans. Information Theory

Volume 39, Issue 

Pages -

Publication date 1993